
    Tight bounds on the maximal perimeter of convex equilateral small polygons

    A small polygon is a polygon that has diameter one. The maximal perimeter of a convex equilateral small polygon with $n = 2^s$ sides is not known when $s \ge 4$. In this paper, we construct a family of convex equilateral small $n$-gons, $n = 2^s$ and $s \ge 4$, and show that their perimeters are within $O(1/n^4)$ of the maximal perimeter and exceed the previously best known values from the literature. In particular, for the first open case $n = 16$, our result proves that Mossinghoff's equilateral hexadecagon is suboptimal.
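    A useful point of reference (this is a standard fact, not the construction from the paper): for even $n$, the regular $n$-gon with unit diameter has circumradius $1/2$, side length $\sin(\pi/n)$, and hence perimeter $n \sin(\pi/n)$. A minimal sketch:

```python
import math

def regular_small_polygon_perimeter(n: int) -> float:
    """Perimeter of the regular n-gon with unit diameter (n even).

    For even n, the diameter is the distance between opposite vertices,
    i.e. twice the circumradius, so R = 1/2 and each of the n sides has
    length 2R*sin(pi/n) = sin(pi/n).
    """
    assert n % 2 == 0
    return n * math.sin(math.pi / n)

# Baseline for the first open case n = 16; the maximal perimeter of a
# convex equilateral small 16-gon exceeds this regular-polygon value.
p16 = regular_small_polygon_perimeter(16)   # ≈ 3.121445, below pi
```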

    Pooling problem: Alternate formulations and solution methods

    Copyright © 2004 INFORMS. The pooling problem, which is fundamental to the petroleum industry, describes a situation in which products possessing different attribute qualities are mixed in a series of pools in such a way that the attribute qualities of the blended products of the end pools must satisfy given requirements. It is well known that the pooling problem can be modeled through bilinear and nonconvex quadratic programming. In this paper, we investigate how best to apply a new branch-and-cut quadratic programming algorithm to solve the pooling problem. To this effect, we consider two standard models: one is based primarily on flow variables, and the other relies on the proportions of flows entering pools. A hybrid of these two models is proposed for general pooling problems. The computational properties of the flow and proportion models are compared on several problem instances taken from the literature. Moreover, a simple alternating procedure and a variable neighborhood search heuristic are developed to solve large instances and compared with the well-known method of successive linear programming. Solution of difficult test problems from the literature is substantially accelerated, and larger ones are solved exactly or approximately. This project was funded by Ultramar Canada and Luc Massé. The work of C. Audet was supported by NSERC (Natural Sciences and Engineering Research Council) fellowship PDF-207432-1998 and by CRPC (Center for Research on Parallel Computation). The work of J. Brimberg was supported by NSERC grant #OGP205041. The work of P. Hansen was supported by FCAR (Fonds pour la Formation des Chercheurs et l'Aide à la Recherche) grant #95ER1048 and NSERC grant #GP0105574.
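    The bilinearity mentioned in the abstract comes from pool qualities multiplying flows. A minimal illustrative sketch (variable names are ours, not the paper's):

```python
# The quality of a pool is the flow-weighted average of its inputs.
def pool_quality(inflows, qualities):
    """Attribute quality of a pool given input flows and qualities."""
    total = sum(inflows)
    return sum(f * q for f, q in zip(inflows, qualities)) / total

# Two feeds into one pool: 10 units at quality 3.0, 30 units at 1.0.
q = pool_quality([10.0, 30.0], [3.0, 1.0])   # (30 + 30) / 40 = 1.5

# Once q is itself a decision variable, an end-pool requirement such as
# q * outflow <= spec * outflow is bilinear in (q, outflow), which is
# what makes the problem nonconvex.
```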

    A progressive barrier for derivative-free nonlinear programming

    algorithm with a progressive barrier fo

    Quantifying uncertainty with ensembles of surrogates for blackbox optimization

    This work is in the context of blackbox optimization, where the functions defining the problem are expensive to evaluate and no derivatives are available. A tried and tested technique is to build surrogates of the objective and the constraints in order to conduct the optimization at a cheaper computational cost. This work proposes different uncertainty measures for use with ensembles of surrogates. The resulting combination of an ensemble of surrogates with our measures behaves as a stochastic model and allows the use of efficient Bayesian optimization tools. The method is incorporated in the search step of the mesh adaptive direct search (MADS) algorithm to improve the exploration of the search space. Computational experiments are conducted on seven analytical problems, two multi-disciplinary optimization problems, and two simulation problems. The results show that the proposed approach solves expensive simulation-based problems with greater precision and lower computational effort than stochastic models. Comment: 36 pages, 11 figures, submitted.
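    The paper proposes several uncertainty measures; the most basic idea can be sketched as follows (a toy illustration, not the paper's measures): treat the spread of the ensemble's predictions at a point as the uncertainty, so the ensemble behaves like a stochastic model.

```python
import numpy as np

def ensemble_predict(surrogates, x):
    """Mean prediction and prediction spread across an ensemble."""
    preds = np.array([s(x) for s in surrogates])
    # The (mean, std) pair acts like the mean and standard deviation
    # of a stochastic model, enabling Bayesian-style acquisition.
    return preds.mean(), preds.std()

# Toy ensemble: three "surrogates" of f(x) = x^2 with different biases.
surrogates = [lambda x: x**2, lambda x: x**2 + 0.1, lambda x: x**2 - 0.1]
mu, sigma = ensemble_predict(surrogates, 2.0)
```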

    A general mathematical framework for constrained mixed-variable blackbox optimization problems with meta and categorical variables

    A mathematical framework for modelling constrained mixed-variable optimization problems is presented in a blackbox optimization context. The framework introduces a new notation and accommodates flexible solution strategies. The notation allows meta and categorical variables to be explicitly and efficiently modelled, which facilitates the solution of such problems. The new term meta variables is used to describe variables that influence which variables are acting or nonacting: meta variables may affect the number of variables and constraints. The flexibility of the solution strategies supports the main blackbox mixed-variable optimization approaches: direct search methods and surrogate-based methods (Bayesian optimization). The notation system and solution strategies are illustrated through an example of a hyperparameter optimization problem from the machine learning community.
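    The acting/nonacting distinction can be illustrated with a hyperparameter-style toy example (variable names here are our own, not the paper's notation): a meta variable decides how many other variables are acting.

```python
def decoded_variables(point):
    """Return only the acting variables, as selected by the meta
    variable 'n_layers' (an illustrative example)."""
    n_layers = point["n_layers"]          # meta variable
    acting = {"n_layers": n_layers,
              "optimizer": point["optimizer"]}   # categorical variable
    for i in range(n_layers):             # only these widths are acting
        acting[f"width_{i}"] = point[f"width_{i}"]
    return acting

point = {"n_layers": 2, "optimizer": "adam",
         "width_0": 64, "width_1": 32, "width_2": 128}
# width_2 is nonacting here because n_layers = 2, so the effective
# dimension of the problem depends on the meta variable's value.
```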

    Sequential stochastic blackbox optimization with zeroth-order gradient estimators

    This work considers stochastic optimization problems in which the objective function values can only be computed by a blackbox corrupted by some random noise following an unknown distribution. The proposed method is based on sequential stochastic optimization (SSO): the original problem is decomposed into a sequence of subproblems. Each of these subproblems is solved using a zeroth-order version of a sign stochastic gradient descent with momentum algorithm (ZO-Signum) with an increasingly fine precision. This decomposition allows a good exploration of the space while maintaining the efficiency of the algorithm once it gets close to the solution. Under a Lipschitz continuity assumption on the blackbox, a convergence rate in expectation is derived for the ZO-Signum algorithm. Moreover, if the blackbox is smooth and convex or locally convex around its minima, a convergence rate to an $\epsilon$-optimal point of the problem may be obtained for the SSO algorithm. Numerical experiments are conducted to compare the SSO algorithm with other state-of-the-art algorithms and to demonstrate its competitiveness.
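    The core update can be sketched in a few lines: estimate a directional derivative from two blackbox evaluations, smooth it with momentum, and step along the sign of the momentum. This is only in the spirit of the ZO-Signum method described above; the constants and the smoothing scheme here are illustrative, not the paper's.

```python
import numpy as np

def zo_signum(f, x, steps=400, mu=1e-4, lr=0.02, beta=0.9, seed=0):
    """Zeroth-order sign-SGD with momentum (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    m = np.zeros_like(x)
    for _ in range(steps):
        u = rng.standard_normal(x.shape)
        # Two-point gradient estimator along the random direction u,
        # using only blackbox evaluations of f.
        g = (f(x + mu * u) - f(x - mu * u)) / (2 * mu) * u
        m = beta * m + (1 - beta) * g     # momentum on the estimate
        x = x - lr * np.sign(m)           # sign step (Signum update)
    return x

# Smooth convex toy blackbox; the iterate should approach the origin.
x_final = zo_signum(lambda v: np.sum(v**2), np.array([3.0, -2.0]))
```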

    Parallel Space Decomposition of the Mesh Adaptive Direct Search Algorithm

    This paper describes a Parallel Space Decomposition (PSD) technique for the Mesh Adaptive Direct Search (MADS) algorithm. MADS extends Generalized Pattern Search for constrained nonsmooth optimization problems. The objective here is to solve larger problems more efficiently. The new method (PSD-MADS) is an asynchronous parallel algorithm in which the processes solve problems over subsets of variables. The convergence analysis, based on the Clarke calculus, is essentially the same as for the MADS algorithm. A practical implementation is described, and numerical results on problems with up to 500 variables illustrate the advantages and limitations of PSD-MADS.
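    The space-decomposition idea can be sketched serially (a toy coordinate search, not MADS itself): each subproblem polls only its own subset of variables while the rest stay fixed; PSD-MADS runs such subproblems asynchronously in parallel with MADS as the subproblem solver.

```python
def subset_search(f, x, subset, step):
    """One pass of coordinate polling restricted to `subset`."""
    x = list(x)
    for i in subset:
        for d in (+step, -step):
            trial = list(x)
            trial[i] += d
            if f(trial) < f(x):       # accept improving trial points
                x = trial
    return x

f = lambda v: sum(t * t for t in v)   # smooth toy objective
x = [4.0, -3.0, 2.0, -1.0]
for step in (2.0, 1.0, 0.5):          # coarse-to-fine step sizes
    for subset in ([0, 1], [2, 3]):   # decomposition into two blocks
        x = subset_search(f, x, subset, step)
# x is now much closer to the minimizer at the origin.
```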